We derive a synaptic weight update rule for learning temporally precise spike train to spike train transformations in multilayer feedforward networks of spiking neurons. The framework, aimed at seamlessly generalizing error backpropagation to the deterministic spiking neuron setting, is based strictly on spike timing and avoids invoking concepts pertaining to spike rates or probabilistic models of spiking. The derivation is founded on two innovations. First, an error functional is proposed that compares the spike train emitted by the output neuron of the network to the desired spike train by way of their putative impact on a virtual postsynaptic neuron. This formulation sidesteps the need for spike alignment and leads to closed-form solutions for all quantities of interest. Second, the virtual assignment of weights to spikes rather than to synapses enables a perturbation analysis of the individual spike times and synaptic weights of the output neuron, as well as of all intermediate neurons in the network, which yields the gradients of the error functional with respect to these entities. Learning proceeds via a gradient descent mechanism that leverages these quantities. Simulation experiments demonstrate the efficacy of the proposed learning framework. The experiments also highlight asymmetries between synapses on excitatory and inhibitory neurons.
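The abstract gives no equations, so the following Python sketch only illustrates the kind of error functional it describes, under assumptions of my own: a causal exponential PSP kernel eps(t) = exp(-t/tau) for t >= 0 on the virtual postsynaptic neuron, and an illustrative time constant tau. With that kernel, the integrated squared difference between the potentials induced by the emitted and desired spike trains reduces to closed-form pairwise kernel sums (this particular choice coincides with the van Rossum spike-train distance), requiring no spike alignment, and each output spike time receives a closed-form gradient of the sort a perturbation analysis would propagate backward. All function names here are hypothetical, not the paper's definitions.

```python
import numpy as np

TAU = 10.0  # assumed PSP time constant in ms (illustrative, not from the paper)

def kappa(a, b, tau=TAU):
    # L2 inner product of two causal exponential PSPs eps(t) = exp(-t/tau),
    # t >= 0, anchored at spike times a and b:
    #   integral eps(t - a) * eps(t - b) dt = (tau / 2) * exp(-|a - b| / tau)
    return 0.5 * tau * np.exp(-np.abs(a - b) / tau)

def dkappa_da(a, b, tau=TAU):
    # Derivative of kappa with respect to its first spike time.
    return -0.5 * np.sign(a - b) * np.exp(-np.abs(a - b) / tau)

def error(actual, desired, tau=TAU):
    """E = 1/2 * integral (u_actual(t) - u_desired(t))^2 dt, where u_* is the
    virtual neuron's potential under each spike train. Expanding the square
    turns E into pairwise kernel sums, so no alignment between the two spike
    trains is ever needed."""
    a = np.asarray(actual, float)[:, None]
    d = np.asarray(desired, float)[:, None]
    return (0.5 * kappa(a, a.T, tau).sum()
            - kappa(a, d.T, tau).sum()
            + 0.5 * kappa(d, d.T, tau).sum())

def error_grad(actual, desired, tau=TAU):
    """dE/dt_i for each emitted spike time t_i: the closed-form spike-time
    gradient that a perturbation analysis would chain back through the
    intermediate neurons' spike times and weights."""
    a = np.asarray(actual, float)[:, None]
    d = np.asarray(desired, float)[:, None]
    return dkappa_da(a, a.T, tau).sum(axis=1) - dkappa_da(a, d.T, tau).sum(axis=1)

# Example: a small gradient step moves the emitted spikes toward the target.
actual = np.array([12.0, 30.0, 55.0])    # emitted spike times (ms)
desired = np.array([10.0, 33.0, 50.0])   # desired spike times (ms)
print(error(actual, desired))
print(error(actual - 0.5 * error_grad(actual, desired), desired))  # smaller
```

In the paper the gradient step would act on synaptic weights (via the virtual per-spike weights) rather than directly on spike times as in this toy example; the direct spike-time step above is shown only to verify that the closed-form gradient points in an error-reducing direction.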